-
Academic research is largely an adult endeavor, which creates systemic power imbalances when studying teen-centered topics such as adolescent online safety. To rectify this problem, we engaged seven teens as co-researchers through a year-and-a-half-long Youth Advisory Board (YAB) program to critically assess our research processes, lead online safety solutions, and reflect on their experiences participating in a YAB. Teens pushed back on standard research practices such as parental consent; sought decision-making power in study documentation, design, and execution; and gave more meaningful feedback on research protocols when more deeply involved in the research. For safety interventions, teens proposed both incremental changes for social media platforms (e.g., advanced privacy settings) and more disruptive changes (e.g., decentralized social media platforms) that enhance individual control, digital resilience, and equity. For the YAB, teens highlighted challenges such as losing momentum over time, a lack of collaborative opportunities, and competing interests, which fueled frustrations and rifts in engagement. Our research underscores the value of involving teens as co-partners in shaping online safety research. Finally, we provide design implications for social media safety interventions that strengthen teens' agency, along with actionable guidelines for developing future long-term programs that ensure meaningful contributions to online safety research.
-
On social media, teens must manage their interpersonal boundaries not only with other people but also with the algorithms embedded in these platforms. In this context, we engaged seven teens in an Asynchronous Remote Community (ARC) as part of a multi-year Youth Advisory Board (YAB) to discuss how they navigate and cope with these boundaries and to co-design for improved boundary management. Teens had preconceived notions of different platforms and navigated boundaries based on specific goals, yet they struggled when platforms lacked the granular controls needed to meet their needs. Teens enjoyed the personalization afforded by algorithms but felt violated when algorithms pushed unwanted content. Teens designed features for enhanced control over their discoverability and for real-time risk detection to avoid boundary turbulence. We provide design guidelines for improved social media boundary management for youth and pinpoint educational opportunities to enhance teens' understanding and use of social media privacy settings and algorithms.
-
Researchers across various fields have investigated how users experience moderation through different perspectives and methodologies. There is now a pressing need to synthesize and extract key insights from prior literature to form a systematic understanding of what constitutes a moderation experience and to explore how such an understanding could further inform moderation-related research and practice. To that end, we conducted a systematic literature review (SLR) of 42 empirical studies on moderation experiences published between January 2016 and March 2022. We describe these studies' characteristics and how they characterize users' moderation experiences. We further identify five primary perspectives that prior researchers have used to conceptualize moderation experiences. These findings suggest an expansive scope of research interest in understanding moderation experiences and in treating moderated users as an important stakeholder group for reflecting on current moderation design; they also point to the dominance of a punitive, solutionist logic in moderation and carry ample implications for future moderation research, design, and practice.
-
Transparency matters greatly to people who experience moderation on online platforms, and much CSCW research has viewed offering explanations as one of the primary ways to enhance moderation transparency. However, relatively little attention has been paid to unpacking what transparency entails in moderation design, especially for content creators. We interviewed 28 YouTubers to understand their moderation experiences and analyze the dimensions of moderation transparency. We identified four primary dimensions: participants desired the moderation system to present moderation decisions saliently, explain those decisions thoroughly, afford effective communication with users, and offer repair and learning opportunities. We discuss how these four dimensions are mutually constitutive and conditioned in the context of creator moderation, where the target of governance mechanisms extends beyond content to creators' careers. We then elaborate on how a dynamic transparency perspective could value content creators' digital labor, how transparency design could support creators' learning, and implications for the transparency design of other creator platforms.
-
How social media platforms can conduct content moderation fairly is gaining attention from society at large. Researchers in HCI and CSCW have investigated whether certain factors affect how users perceive moderation decisions as fair or unfair. However, little attention has been paid to unpacking how users' perceptions of (un)fairness form from their moderation experiences, especially for users who monetize their content. By interviewing 21 for-profit YouTubers (i.e., video content creators), we found three primary ways in which participants assess moderation fairness: equality across their peers, consistency across moderation decisions and policies, and their voice in algorithmic visibility decision-making processes. Building on these findings, we discuss how our participants' fairness perceptions demonstrate a multi-dimensional notion of moderation fairness and how YouTube implements an algorithmic assemblage to moderate YouTubers. We derive translatable design considerations for fairer moderation systems on platforms that afford creator monetization.
